Contaminant source localization via Bayesian global optimization
Contaminant source localization problems require efficient and
robust methods that can account for geological heterogeneities and accommodate
relatively small data sets of noisy observations. Because realistic modeling
demands high-fidelity simulations, computational costs call for global
optimization algorithms that operate under parsimonious evaluation budgets.
Bayesian optimization approaches are well suited to such settings: they
explore parameter spaces in a principled way, iteratively locating the global
optimum (or optima) while maintaining an approximation of the objective
function together with an instrumental quantification of prediction uncertainty. Here,
we adapt a Bayesian optimization approach to localize a contaminant source in
a discretized spatial domain. We thus demonstrate the potential of such a
method for hydrogeological applications and also provide test cases for the
optimization community. The localization problem is illustrated for cases
where the geology is assumed to be perfectly known. Two 2-D synthetic cases
that display sharp hydraulic conductivity contrasts and specific connectivity
patterns are investigated. These cases generate highly nonlinear objective
functions that present multiple local minima. A derivative-free global
optimization algorithm relying on a Gaussian process model and the expected
improvement criterion is used to efficiently locate the minimum of the
objective function, which corresponds to the contaminant source location.
Even though the concentration measurements contain a significant level of
proportional noise, the algorithm locates the source efficiently. The
variations of the objective function are driven primarily by the geology and
secondarily by the design of the monitoring
well network. The data and scripts used to generate the objective functions
are shared to support reproducible research. This contribution is important
because the functions present multiple local minima and are inspired by a
practical field application. Sharing these complex objective functions
provides a source of test cases for global optimization benchmarks and should
help in designing new and efficient methods to solve this type of problem.
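As a concrete illustration of the approach described above, the following is a minimal sketch of Gaussian-process-based Bayesian optimization with the expected improvement criterion over a discretized 2-D domain. The objective `misfit`, the grid layout, and all parameter settings are illustrative assumptions, not the data or scripts shared by the authors.

```python
# Minimal sketch of GP-based Bayesian optimization with expected improvement
# (EI) over a discretized 2-D domain. The objective `misfit` is a hypothetical
# stand-in for a data-misfit function with multiple basins, not the paper's code.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def misfit(x):
    # Hypothetical objective: global minimum at the "true" source (0.3, 0.7),
    # plus a secondary basin near (0.8, 0.2) to emulate multiple local minima.
    return (np.sum((x - [0.3, 0.7]) ** 2)
            + 0.5 * np.exp(-20 * np.sum((x - [0.8, 0.2]) ** 2)))

# Candidate source locations: a regular grid over the unit square.
g = np.linspace(0, 1, 25)
grid = np.array([(a, b) for a in g for b in g])

# Initial design: a handful of random evaluations.
idx = rng.choice(len(grid), size=5, replace=False)
X = grid[idx]
y = np.array([misfit(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6, normalize_y=True)
for _ in range(30):  # parsimonious evaluation budget
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    f_min = y.min()
    z = (f_min - mu) / np.maximum(sigma, 1e-12)
    # Expected improvement for minimization: E[max(f_min - f(x), 0)].
    ei = (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = grid[np.argmax(ei)]  # most promising candidate location
    X = np.vstack([X, x_next])
    y = np.append(y, misfit(x_next))

print("estimated source location:", X[np.argmin(y)])
```

The EI criterion balances exploitation (low GP mean) against exploration (high GP standard deviation), which is what allows the search to escape local minima under a tight evaluation budget.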
SQG-Differential Evolution for difficult optimization problems under a tight function evaluation budget
In the context of industrial engineering, it is important to integrate
efficient computational optimization methods into the product development
process. Some of the most challenging simulation-based engineering design
optimization problems are characterized by: a large number of design variables,
the absence of analytical gradients, highly non-linear objectives and a limited
function evaluation budget. Although a wide variety of optimization
algorithms is available, the development and selection of efficient
algorithms for problems with these industrially relevant characteristics
remains a challenge. In this communication, a hybrid variant of Differential Evolution
(DE) is introduced that incorporates aspects of Stochastic Quasi-Gradient
(SQG) methods into the DE framework in order to improve optimization
efficiency on problems with the aforementioned characteristics. The performance of
the resulting derivative-free algorithm is compared with other state-of-the-art
DE variants on 25 commonly used benchmark functions, under tight function
evaluation budget constraints of 1000 evaluations. The experimental results
indicate that the new algorithm performs excellently on the 'difficult'
(high-dimensional, multimodal, non-separable) test functions. The operations
used in the proposed mutation scheme are computationally inexpensive and can
be easily implemented in existing Differential Evolution variants or other
population-based optimization algorithms with a few lines of program code, as
a non-invasive optional setting. Besides the applicability of the presented
algorithm by itself, the described concepts can serve as a useful and
interesting addition to the algorithmic operators in the frameworks of
heuristic and evolutionary optimization and computing.
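As a rough illustration of the hybrid idea, the sketch below augments a classic DE/rand/1 mutation with a stochastic quasi-gradient step estimated from the function-value difference between two population members. The gradient surrogate, the toy `sphere` objective, and the parameter settings are illustrative assumptions under the stated budget of 1000 evaluations; they do not reproduce the authors' exact SQG-DE scheme.

```python
# Hedged sketch: DE/rand/1 mutation plus a stochastic quasi-gradient (SQG)
# descent step derived from function values already stored for the population,
# so the gradient surrogate costs no extra evaluations.
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):  # toy objective; a stand-in for a hard black-box function
    return float(np.sum(x ** 2))

dim, n_pop, F, CR, budget = 10, 20, 0.5, 0.9, 1000
pop = rng.uniform(-5, 5, (n_pop, dim))
fit = np.array([sphere(x) for x in pop])
evals = n_pop

while evals < budget:
    for i in range(n_pop):
        # Three distinct partners, as in classic DE/rand/1.
        r1, r2, r3 = rng.choice([j for j in range(n_pop) if j != i], 3, replace=False)
        # SQG surrogate (assumption): a secant estimate of the gradient along
        # the direction between two randomly chosen population members.
        d = pop[r3] - pop[r2]
        g = (fit[r3] - fit[r2]) / (np.dot(d, d) + 1e-12) * d
        # Hybrid mutant: DE difference vector plus a step against the
        # estimated gradient.
        v = pop[r1] + F * (pop[r2] - pop[r3]) - F * g
        # Binomial crossover followed by greedy selection.
        mask = rng.random(dim) < CR
        mask[rng.integers(dim)] = True
        trial = np.where(mask, v, pop[i])
        f_trial = sphere(trial)
        evals += 1
        if f_trial <= fit[i]:
            pop[i], fit[i] = trial, f_trial
        if evals >= budget:
            break

print("best value within budget:", fit.min())
```

Because the quasi-gradient reuses stored fitness values, the added operations amount to a few vector computations per mutation, consistent with the claim that the scheme is computationally inexpensive and can be dropped into existing population-based algorithms as an optional setting.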